## Magsoup V1 12B
grimjim · bfloat16 precision · Large Language Model · Transformers

A merge of multiple 12B-parameter-scale language models built with the mergekit tool, using Magnolia-v3-12B as the base model; a minimal configuration sketch follows below.

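Merges of this kind are typically produced by feeding mergekit a YAML configuration that names the source models, the merge method, the base model, and the output dtype. The sketch below writes a hypothetical config of that shape and runs the `mergekit-yaml` command-line entry point; the repository paths, the interpolation weight `t`, and the exact schema keys are illustrative assumptions and may differ across mergekit versions.

```python
# Hypothetical sketch: write a mergekit-style YAML config and run the
# mergekit-yaml CLI (assumes mergekit is installed, e.g. `pip install mergekit`).
# Repository names, the `t` weight, and the exact schema are illustrative only.
import subprocess
from pathlib import Path

config = """\
merge_method: slerp            # spherical interpolation between two checkpoints
base_model: grimjim/Magnolia-v3-12B
models:
  - model: grimjim/Magnolia-v3-12B
  - model: some-org/other-12B-model   # placeholder for the second source model
parameters:
  t: 0.5                       # interpolation weight between the two models
dtype: bfloat16                # store merged weights in bfloat16 precision
"""

Path("merge-config.yaml").write_text(config)

# Positional arguments: config file, then the output directory for the merge.
subprocess.run(["mergekit-yaml", "merge-config.yaml", "./merged-model"], check=True)
```
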
## Cydonia V1.3 Magnum V4 22B
knifeayumu · Other · Large Language Model · Transformers

A 22B-parameter language model merged from Cydonia-22B-v1.3 and Magnum-v4-22B using the SLERP method; a sketch of SLERP weight interpolation follows below.

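SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along a straight line. The sketch below illustrates the standard formula applied to a pair of bfloat16 tensors; it is a standalone illustration, not the exact routine used to build the models listed here.

```python
# Minimal sketch of SLERP applied to two weight tensors, as used conceptually
# when merging checkpoints. Illustrative only; not the authors' merge code.
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between tensors a and b at ratio t in [0, 1]."""
    a32, b32 = a.float().flatten(), b.float().flatten()
    a_unit = a32 / (a32.norm() + eps)
    b_unit = b32 / (b32.norm() + eps)
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.arccos(dot)            # angle between the two weight vectors
    if omega.abs() < 1e-4:               # nearly parallel: fall back to plain lerp
        out = (1.0 - t) * a32 + t * b32
    else:
        sin_omega = torch.sin(omega)
        out = (torch.sin((1.0 - t) * omega) / sin_omega) * a32 \
            + (torch.sin(t * omega) / sin_omega) * b32
    return out.reshape(a.shape).to(a.dtype)

# Toy usage: interpolate two random bfloat16 "layers" halfway (t = 0.5).
w_a = torch.randn(1024, 1024, dtype=torch.bfloat16)
w_b = torch.randn(1024, 1024, dtype=torch.bfloat16)
w_merged = slerp(0.5, w_a, w_b)
print(w_merged.dtype, w_merged.shape)
```
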
## Cydonia V1.2 Magnum V4 22B
knifeayumu · Other · Large Language Model · Transformers

A 22B-parameter language model merged from Cydonia-22B-v1.2 and Magnum-v4-22b using the SLERP method.

## MeowGPT-3.5 11B
Chickaboo · Large Language Model · Transformers

MeowGPT-3.5 is a merged model based on the GPT-3.5 architecture, focused on text-generation tasks.

## Eris Lelantacles V2 7B
ChaoticNeutrals · Large Language Model · Transformers

A hybrid model obtained by merging the 7B-parameter models Eros-7b-test and Eris-Lelanacles-7b using the SLERP method.
